Collaborating Authors

Zhao Song
Total Least Squares Regression in Input Sparsity Time

Huaian Diao, Zhao Song, David Woodruff, Xin Yang

Neural Information Processing Systems

In the total least squares problem, one is given an m × n matrix A and an m × d matrix B, and one seeks to "correct" both A and B, obtaining matrices Â and B̂, so that there exists an X satisfying the equation ÂX = B̂. Typically the problem is overconstrained, meaning that m ≫ max(n, d).
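The classical (dense) total least squares solution can be sketched via an SVD of the stacked matrix [A | B]; this is the textbook method, not the input-sparsity-time algorithm of the paper, and the function name below is illustrative:

```python
import numpy as np

def total_least_squares(A, B):
    """Classical TLS via SVD of the stacked matrix [A | B].

    Returns X minimizing the Frobenius-norm correction to both A and B
    such that (A + dA) X = B + dB. Illustrative helper, not the paper's
    fast algorithm.
    """
    m, n = A.shape
    d = B.shape[1]
    # Right singular vectors of [A | B]; the last d columns of V span
    # the directions associated with the smallest singular values.
    _, _, Vt = np.linalg.svd(np.hstack([A, B]), full_matrices=True)
    V = Vt.T
    V12 = V[:n, n:]   # top-right block, shape (n, d)
    V22 = V[n:, n:]   # bottom-right block, shape (d, d)
    # X = -V12 V22^{-1}; V22 must be invertible for the TLS solution
    # to exist in this form.
    return -np.linalg.solve(V22.T, V12.T).T
```

When the system is exactly consistent (B = AX for some X), the last d right singular vectors span the null space of [A | B] and the routine recovers X with zero correction.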


c164bbc9d6c72a52c599bbb43d8db8e1-Paper.pdf

Neural Information Processing Systems

Deep neural networks have achieved impressive performance in many areas. Designing a fast and provable method for training neural networks is a fundamental question in machine learning. The classical training method requires Ω(mnd) cost for both the forward and the backward computation, where m is the width of the neural network and we are given n training points in d-dimensional space.
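As a minimal illustration of where the mnd term comes from (a sketch with made-up sizes, not the paper's method): a forward pass for a one-hidden-layer network of width m on n points in d dimensions is dominated by the dense product X @ W1, which performs n·d·m multiply-adds.

```python
import numpy as np

# Illustrative sizes: n training points, d input dimensions, width m.
n, d, m = 8, 4, 16
rng = np.random.default_rng(0)
X = rng.standard_normal((n, d))    # training points
W1 = rng.standard_normal((d, m))   # hidden-layer weights
a = rng.standard_normal(m)         # output-layer weights

def forward(X, W1, a):
    """f(x) = a^T ReLU(W1^T x) for all n points at once.

    The (n, d) x (d, m) product below costs n*d*m multiply-adds,
    matching the Omega(mnd) forward cost mentioned in the text.
    """
    H = np.maximum(X @ W1, 0.0)    # (n, m) hidden activations
    return H @ a                   # (n,) outputs

out = forward(X, W1, a)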


Networks

Neural Information Processing Systems

Despite being powerful and well understood, kernel ridge regression suffers from costly computation when dealing with large datasets, since a direct implementation of Eq. (1) generally requires O(n³) running time.
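The bottleneck can be seen in a naive implementation: forming the n × n kernel matrix and solving the regularized linear system is an O(n³) dense solve. A minimal sketch with a Gaussian kernel (parameter names here are illustrative):

```python
import numpy as np

def kernel_ridge_regression(X, y, lam=1e-2, gamma=1.0):
    """Naive KRR fit: solve (K + lam*I) alpha = y.

    Building K costs O(n^2 d); the dense linear solve is the O(n^3)
    bottleneck mentioned in the text.
    """
    sq = np.sum(X**2, axis=1)
    # Gaussian kernel matrix K[i, j] = exp(-gamma * ||x_i - x_j||^2)
    K = np.exp(-gamma * (sq[:, None] + sq[None, :] - 2 * X @ X.T))
    n = len(y)
    return np.linalg.solve(K + lam * np.eye(n), y)   # O(n^3)

def predict(X_train, X_new, alpha, gamma=1.0):
    """Evaluate f(x) = sum_i alpha_i k(x_i, x) at the new points."""
    d2 = (np.sum(X_new**2, axis=1)[:, None]
          + np.sum(X_train**2, axis=1)[None, :]
          - 2 * X_new @ X_train.T)
    return np.exp(-gamma * d2) @ alpha
```

Sketching and approximation techniques aim to avoid ever materializing or inverting the full n × n matrix.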